Multi-fidelity sparse polynomial chaos expansion based on Gaussian process regression and least angle regression
Authors
Abstract
Similar articles
Adaptive sparse polynomial chaos expansion based on least angle regression
Polynomial chaos (PC) expansions are used in stochastic finite element analysis to represent the random model response by a set of coefficients in a suitable (so-called polynomial chaos) basis. The number of terms to be computed grows dramatically with the size of the input random vector, which makes the computational cost of classical solution schemes (may it be intrusive (i.e. of Galerkin typ...
Sparse Greedy Gaussian Process Regression
Peter Bartlett RSISE Australian National University Canberra, ACT, 0200 [email protected] We present a simple sparse greedy technique to approximate the maximum a posteriori estimate of Gaussian Processes with much improved scaling behaviour in the sample size m. In particular, computational requirements are O(n²m), storage is O(nm), the cost for prediction is O(n) and the cost to com...
Sparse Spectrum Gaussian Process Regression
We present a new sparse Gaussian Process (GP) model for regression. The key novel idea is to sparsify the spectral representation of the GP. This leads to a simple, practical algorithm for regression tasks. We compare the achievable trade-offs between predictive accuracy and computational requirements, and show that these are typically superior to existing state-of-the-art sparse approximations...
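Sparsifying the spectral representation, as the abstract above describes, amounts to approximating the GP prior with a small number of random spectral points, after which inference reduces to Bayesian linear regression in a trigonometric feature space. A minimal numpy sketch under assumed data, lengthscale, and noise settings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D dataset (illustrative only).
X = rng.uniform(-3, 3, size=(150, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(150)

m, ell, sf2, noise = 50, 0.5, 1.0, 0.01   # spectral points, lengthscale, signal/noise variance

# Sample spectral frequencies from the RBF kernel's spectral density; the GP
# prior is then approximated by m pairs of trigonometric basis functions.
W = rng.standard_normal((m, 1)) / ell

def features(Z):
    """Sparse-spectrum feature map: cos/sin at the sampled frequencies."""
    proj = Z @ W.T
    return np.hstack([np.cos(proj), np.sin(proj)]) * np.sqrt(sf2 / m)

# Inference reduces to Bayesian linear regression in feature space,
# costing O(m^2 n) rather than the O(n^3) of an exact GP.
Phi = features(X)
A = Phi.T @ Phi + noise * np.eye(2 * m)
w_mean = np.linalg.solve(A, Phi.T @ y)

X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
print(features(X_test) @ w_mean)     # posterior-mean predictions
```

The trade-off the abstract mentions is visible in the constants: accuracy grows with the number of spectral points m, while cost grows only linearly in the dataset size n.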
Multi-fidelity Gaussian process regression for prediction of random fields
Article history: Received 17 July 2016 Received in revised form 10 November 2016 Accepted 23 January 2017 Available online xxxx
Incremental Variational Sparse Gaussian Process Regression
Recent work on scaling up Gaussian process regression (GPR) to large datasets has primarily focused on sparse GPR, which leverages a small set of basis functions to approximate the full Gaussian process during inference. However, the majority of these approaches are batch methods that operate on the entire training dataset at once, precluding the use of datasets that are streaming or too large ...
Journal
Journal title: Journal of Physics: Conference Series
Year: 2021
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/1730/1/012091